With more than a billion monthly active users worldwide, TikTok is a popular social networking app for creating and sharing short videos. The platform is particularly popular with young people: 25% of TikTok users in the U.S. are between the ages of 10 and 19, and 43% of the global user base is between 18 and 24 years old.
However, with that widespread popularity comes a host of problems the company is struggling to resolve, chief among them protecting children from harm on its platform. TikTok has come under fire over the past several years for predators easily accessing and grooming children, as well as for hosting sexually explicit and pornographic content. Researchers at the National Center on Sexual Exploitation have documented these issues extensively (a sample of the evidence can be found here).
The Dangers of Sexual Exploitation on TikTok
Several major media outlets have investigated and reported on the extensive harms on TikTok. Forbes has called the app “a magnet to sexual predators,” and the Wall Street Journal has published a series of stories in the past half year about TikTok’s algorithms serving up pornography and drugs to minors, content promoting eating disorders, dangerous viral trends, and the negative mental health impact on teen girls using the platform, especially those posting “suggestive” videos.
TikTok’s reputation as a “hunting ground” for predators has not stopped its growth, and the platform has given predators easy access to groom, abuse, and traffic children. Exploiters use TikTok to view minor users’ videos, comment on them, and message children, often requesting or sending sexually explicit videos or pictures.
In the fall of 2021, TikTok executives were called to testify about failures to protect young children online, as part of Congressional hearings that also scrutinized similar issues on Facebook/Instagram, Snapchat, and YouTube.
In April 2022, the Financial Times reported that the U.S. Department of Homeland Security is investigating how TikTok handles child sexual abuse material (CSAM, sometimes referred to as “child pornography”), while the Department of Justice is reviewing how predators are exploiting a specific privacy feature on the app. The article states:
“’It is a perfect place for predators to meet, groom and engage children,’ said Erin Burke, unit chief of the child exploitation investigations unit at Homeland Security’s cyber crime division, calling it the ‘platform of choice’ for the behaviour.”
Investigations into these issues reveal a larger problem, not just with TikTok but with how social media platforms in general handle moderation. Companies like Meta employ over 15,000 human moderators in addition to AI software that can catch known (“hashed”) instances of CSAM, but even then, the rising tide of child sexual abuse and other forms of exploitation is overwhelming.
The Internet Watch Foundation noted that 2021 was the “worst year on record” for online child sexual abuse. Yet the National Center for Missing and Exploited Children (NCMEC), which handles reports of child abuse from these platforms, found that TikTok made only 155,000 reports in the last year, compared to Instagram’s 3.4 million.
Clearly, TikTok’s “zero-tolerance” policy for child sexual abuse material and other inappropriate content is not working: live videos featuring nudity and sex acts appear in feeds; clever hashtags and alternate spellings are used to evade bans and promote dangerous and exploitative sites like OnlyFans; adult predators target and groom young children; and indicators of CSAM trading abound. The Financial Times explored some of the patterns of behavior that allow this to happen on TikTok:
“One pattern that the Financial Times verified with law enforcement and child safety groups was content being procured and traded through private accounts, by sharing the password with victims and other predators. Key code words are used in public videos, user names and biographies, but the illegal content is uploaded using the app’s ‘Only Me’ function where videos are only visible for those logged into the profile.”
Solutions for Combating Sexual Abuse and Exploitation on TikTok
TikTok has implemented several of our recommendations to significantly improve its safety features for minors, such as disabling direct messaging for users under 16 and allowing parents to lock caregiver controls with a PIN code. These features go beyond what any other social media platform has implemented to date. TikTok has also released extensive Community Guidelines that clearly define terms and list activities and content prohibited on the platform, including content that “depicts, promotes, or glorifies” prostitution or pornography, content that simulates sexual activity (whether verbally, in text, or even through emojis), and content depicting non-consensual sex.
Despite these changes, it is clear that TikTok needs to do much more to protect minors from predators, exposure to sexual content, and pornographic websites. We (and others, like the National Association of Attorneys General) have been pressing TikTok to give more control to parents and to proactively moderate content, because our own research confirms what the Wall Street Journal found: using an account we created for a 13-year-old, we were easily able to find videos promoting OnlyFans, as well as other pornography and prostitution sites, even though this type of material is against TikTok’s Community Guidelines.
TikTok and all social media companies must be accountable for the environments they create, especially when their insufficient policies and practices leave so much room for exploitation, abuse, and harm.